Tuesday, April 22, 2003

Randomness as a professional disease



In the May/June 2003 issue of Computing in Science and Engineering, Dietrich Stauffer has an article on Sociophysics Simulations (p. 71):


"Statistical physicists, of course, assume that these hierarchies arise from randomness (The illusion that everything is random is a professional disease - morbus Boltzmann - among such physicists, just as silicosis, or black lung, affects mine workers)"

Saturday, April 19, 2003

Generating trial moves cheaply



Today's article is "Monte Carlo simulations using sampling from an approximate potential" by Lev Gelb (JCP 118, 7747). He describes using an approximate potential to make a series of (cheap) moves and then accepting or rejecting that whole series based on the desired (correct, expensive) potential. This is similar to (but not quite the same as) the multilevel Monte Carlo used in other work. The focus of multilevel sampling is early rejection of ultimately undesirable trial moves; the focus of Gelb's method seems to be creating uncorrelated trial moves to evaluate with the expensive potential.
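
As I read it, the scheme is roughly the following (a minimal sketch in Python; the names v_approx, v_true, n_sub, and max_disp are mine, not from the paper, and I use a single one-dimensional coordinate for brevity): run n_sub ordinary Metropolis moves with the cheap potential, then accept or reject the whole excursion with a factor that corrects for the difference between the two potentials.

    import math, random

    def metropolis_accept(delta_e, beta):
        """Standard Metropolis criterion (guards against exp overflow for downhill moves)."""
        return delta_e <= 0 or random.random() < math.exp(-beta * delta_e)

    def composite_move(x, v_approx, v_true, n_sub, max_disp, beta):
        """One composite move: n_sub cheap sub-moves with v_approx, then a
        single accept/reject against the expensive v_true."""
        x_old, x_new = x, x
        # inner chain: ordinary Metropolis sampling of exp(-beta * v_approx)
        for _ in range(n_sub):
            trial = x_new + random.uniform(-max_disp, max_disp)
            if metropolis_accept(v_approx(trial) - v_approx(x_new), beta):
                x_new = trial
        # outer step: correct for the difference between the two potentials
        d_true = v_true(x_new) - v_true(x_old)
        d_approx = v_approx(x_new) - v_approx(x_old)
        if metropolis_accept(d_true - d_approx, beta):
            return x_new   # composite move accepted
        return x_old       # composite move rejected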


I'm a little disappointed that the author didn't reference the similar multilevel Monte Carlo used in PIMC (RMP 67, 279) (see section V.F, p. 329) and VMC (JCP 113, 5123).


On page 7749, he mentions that an empirical potential could be used as the approximate potential and an ab initio potential as the correct potential. This has been done, sort of (also in Recent Advances in Quantum Monte Carlo Methods II (World Scientific, 2002) or here). I called the technique "pre-rejection". Consider using Gelb's method, but make only a single trial move with the approximate potential. If that trial move is rejected, the configuration is unchanged, so the energy change is clearly zero and we know the (null) composite move will be accepted without having to evaluate the expensive potential. The shortcut can be extended to more trial moves, but its value drops off quickly because the probability that all of the trial moves are rejected under the approximate potential becomes very small.
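
In code, the single-sub-move version looks something like this (again only a sketch, reusing the metropolis_accept helper from the block above); the early return is the whole point of pre-rejection:

    def pre_rejection_move(x, v_approx, v_true, max_disp, beta):
        """Single trial move, screened first by the cheap potential."""
        trial = x + random.uniform(-max_disp, max_disp)
        # cheap screening step with the approximate potential
        if not metropolis_accept(v_approx(trial) - v_approx(x), beta):
            return x   # pre-rejected: nothing changed, no expensive evaluation needed
        # only now pay for the expensive potential, with the correction factor
        d_true = v_true(trial) - v_true(x)
        d_approx = v_approx(trial) - v_approx(x)
        if metropolis_accept(d_true - d_approx, beta):
            return trial
        return x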


Gelb applied his method to a Lennard-Jones fluid with a cut-and-shifted potential. The approximate potential uses a short cutoff and the correct potential uses a larger one.
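
For concreteness, a cut-and-shifted Lennard-Jones pair potential is just this (reduced units; the two cutoff values in the comment are placeholders, not the ones used in the paper):

    def lj_cut_shifted(r, rc, epsilon=1.0, sigma=1.0):
        """Lennard-Jones pair potential, cut at rc and shifted to zero at the cutoff."""
        if r >= rc:
            return 0.0
        def lj(s):
            x6 = (sigma / s) ** 6
            return 4.0 * epsilon * (x6 * x6 - x6)
        return lj(r) - lj(rc)

    # e.g. approximate potential: lambda r: lj_cut_shifted(r, rc=1.5)
    #      correct potential:     lambda r: lj_cut_shifted(r, rc=3.0)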


Table I shows speedups and energies (and pressures and heat capacities) with error bars. There seems to be a tradeoff: more speedup lowers the acceptance ratio and leads to larger error bars. This can be quantified by computing the efficiency 1/(T * sigma^2) (or, equivalently, Speedup/sigma^2). Computing efficiencies for system 1 gives

    System   Efficiency of energy (x 10^4)
    1(ref)   12
    1(a)     14
    1(b)     19
    1(c)     14
    1(d)     25

From the paper, it looks like runs a,b and runs c,d share the same relevant parameters (rc and M), and this method definitely does increase the efficiency.
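
The bookkeeping behind that column is trivial; a sketch, with made-up numbers rather than the actual Table I values:

    def efficiency(speedup, sigma):
        """Efficiency ~ 1/(T * sigma^2); with T expressed as a speedup over the
        reference run this is just speedup / sigma^2 (up to a constant)."""
        return speedup / sigma ** 2

    # hypothetical example, not the paper's numbers:
    # efficiency(1.0, 0.003)   # reference run
    # efficiency(2.0, 0.003)   # twice as fast with the same error bar -> twice as efficient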

Thursday, April 17, 2003

The Quest for Better Wavefunctions


One of the nice features of QMC is the freedom in the choice of wavefunction. The downside is that optimizing wavefunctions in VMC can be time-consuming and unwieldy.


In a recent article - Backflow Correlations for the Electron Gas and Metallic Hydrogen - Holzmann et al. look at improving wavefunctions by adding analytic information.


The section with the Feynman-Kac approach uses a cumulant expansion. In case you (read: me) need a reminder of the cumulant expansion, look at this online statistics text (chapter 8.4).
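
For the record, the cumulant expansion in question is the standard one (written here keeping the first two cumulants explicitly):

    \langle e^{X} \rangle
      = \exp\!\left( \langle X \rangle
        + \tfrac{1}{2}\left( \langle X^{2} \rangle - \langle X \rangle^{2} \right)
        + \cdots \right)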

Wednesday, April 09, 2003

Followup to "Testing Monte Carlo"



I wrote a program that uses Simpson's rule to compute the energy of two particles in a box interacting via a Lennard-Jones potential.


The web page and program are here.

It runs in a second or so on my 1 GHz P3. I expect 3 particles should be doable.
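
For reference, composite Simpson's rule in one dimension is short enough to quote (a generic sketch, not the actual program; the multi-particle energy just nests sums like this over each coordinate):

    def simpson(f, a, b, n):
        """Composite Simpson's rule on [a, b] with n (even) subintervals."""
        if n % 2:
            raise ValueError("n must be even")
        h = (b - a) / n
        total = f(a) + f(b)
        for i in range(1, n):
            total += (4 if i % 2 else 2) * f(a + i * h)
        return total * h / 3.0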


Edit 9/4/2004, fixed the link.

Monday, April 07, 2003

Testing Monte Carlo


You've just written a brand new Monte Carlo code. It's producing vast amounts of data on your fast new computer. How do you know whether that random-looking time series will average to a beautiful result and lead to deep insight, or whether it is simply random noise from the physics of some other universe?


There are several techniques we can use to verify that the codes we write are working correctly.


  1. Unit testing
    This is standard software engineering (or *should* be standard). This involves small drivers to test individual classes or routines. It's good for verifying small, easily understood pieces before integrating into a larger system and for regression testing (making sure the latest change didn't break something).
  2. Solve simple systems with analytic or well-known answers.
    The harmonic oscillator is popular. For QMC, H2 is a standard test case, and very accurate answers are available from other techniques.
  3. Compute internal quantities in multiple ways.
    The best example in QMC is the local energy: compute the derivatives analytically and numerically (a small sketch of this kind of check follows the list). In PIMC there are multiple energy estimators.
  4. Compute system values with another integration method.
    It seems that computers are fast enough that we should be able to use grid-based methods to compute answers for small numbers (1-3) of particles.
    For QMC, shutting off the electron-electron interaction simplifies the problem, and Mathematica can handle the resulting integrals (numerically, not symbolically).
  5. Comparison with literature
    Compare with experiment and answers obtained by other methods.
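
For item 3, the kind of check I have in mind is a generic finite-difference comparison (only a sketch; f would be, say, ln(psi) or the local energy and grad_f its hand-coded derivative - the names are mine, not from any particular code):

    def check_gradient(f, grad_f, x, h=1e-5, tol=1e-6):
        """Compare an analytic gradient against centered finite differences,
        component by component, and report any disagreement."""
        analytic = grad_f(x)
        for i in range(len(x)):
            xp, xm = list(x), list(x)
            xp[i] += h
            xm[i] -= h
            numeric = (f(xp) - f(xm)) / (2.0 * h)
            if abs(numeric - analytic[i]) > tol:
                print("mismatch in component", i, ":", numeric, "vs", analytic[i])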